Testing Predictor Contributions in Sufficient Dimension Reduction

Author

  • R. Dennis Cook
Abstract

We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
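To make the inverse-regression idea concrete, the sketch below shows the basic sliced inverse regression (SIR) step that the methodology builds on: standardize the predictors, slice the response, and take the leading eigenvectors of the covariance of the within-slice means. This is a minimal illustration in Python/NumPy, not the paper's testing procedure; the function name sir_directions, the slice count n_slices, and the toy data are illustrative assumptions, not taken from the paper.

import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Estimate d dimension-reduction directions by sliced inverse regression."""
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Partition the observations into roughly equal-size slices by sorting y.
    slices = np.array_split(np.argsort(y), n_slices)

    # Weighted covariance of the within-slice means of Z (the SIR kernel matrix).
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original predictor scale.
    w, v = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ v[:, ::-1][:, :d]

# Toy example: y depends on X only through the single direction x1 + x2.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + X[:, 1]) ** 3 + 0.5 * rng.normal(size=500)
print(sir_directions(X, y, d=1).ravel())

On this toy data the estimated direction should be roughly proportional to (1, 1, 0, 0, 0). The tests developed in the paper then ask whether selected coordinates of such directions are zero, i.e., whether the corresponding predictors contribute to the regression.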


Similar articles

On model-free conditional coordinate tests for regressions

Existing model-free tests of the conditional coordinate hypothesis in sufficient dimension reduction (Cook (1998) [3]) have focused mainly on first-order estimation methods such as sliced inverse regression (Li (1991) [14]). Such testing procedures, based on quadratic inference functions, are difficult to extend to second-order sufficient dimension reduction methods such as the...


Sufficient dimension reduction for the conditional mean with a categorical predictor in multivariate regression

Recent sufficient dimension reduction methodologies in multivariate regression do not apply directly to a categorical predictor. To address this, we define the multivariate central partial mean subspace and propose two methodologies to estimate it. The first method uses ordinary least squares. Chi-squared distributed statistics for dimension tests are constructed, and an estimate of the t...


Sufficient dimension reduction in regressions across heterogeneous subpopulations

Sliced inverse regression is one of the widely used dimension reduction methods. Chiaromonte and co-workers extended this method to regressions with qualitative predictors and developed a method, partial sliced inverse regression, under the assumption that the covariance matrices of the continuous predictors are constant across the levels of the qualitative predictor. We extend partial sliced i...


A robust inverse regression estimator

A family of dimension reduction methods was developed by Cook and Ni [Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J. Amer. Statist. Assoc. 100, 410–428] by minimizing a quadratic objective function. Its optimal member, called the inverse regression estimator (IRE), was proposed. However, its calculation involves higher-order moments of the predictors. ...


Estimating the Central Kth Moment Space via an Extension of Ordinary Least Squares

Various sufficient dimension reduction methods have been proposed to find linear combinations of the predictor X that contain all the regression information of Y versus X. If we are only interested in the partial information contained in the mean function or the kth moment function of Y given X, estimation of the central mean space (CMS) or the central kth moment space (CKMS) becomes our focus. H...



Journal title:

Volume   Issue

Pages   -

Publication date: 2004